# Multi-task understanding

## AMD OLMo 1B

License: Apache-2.0 · Author: amd · Tags: Large Language Model, Safetensors

AMD-OLMo is a series of 1-billion-parameter language models trained from scratch by AMD on AMD Instinct™ MI250 GPUs.

4,419 downloads · 25 likes
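A minimal sketch of loading this model for text generation with Hugging Face Transformers. The repo id `amd/AMD-OLMo-1B` is an assumption based on the author and model name above; check the model page for the exact identifier.

```python
# Minimal sketch: causal-LM generation with AMD-OLMo 1B via Transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amd/AMD-OLMo-1B"  # assumed repo id, inferred from the listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Generate a short continuation from a prompt.
inputs = tokenizer("The AMD Instinct MI250 is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```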
## Qwen2 1.5B Ita

License: Apache-2.0 · Author: DeepMount00 · Tags: Large Language Model, Transformers, Supports Multiple Languages

Qwen2 1.5B Ita is a compact language model optimized specifically for Italian, with performance close to ITALIA (iGenius) while being six times smaller.

6,220 downloads · 21 likes
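Since the listing tags this model as Transformers-compatible, Italian text generation should be possible through the high-level `pipeline` API. The repo id `DeepMount00/Qwen2-1.5B-Ita` is an assumption from the listing.

```python
# Minimal sketch: Italian text generation with the Transformers pipeline API.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="DeepMount00/Qwen2-1.5B-Ita",  # assumed repo id, inferred from the listing
)

result = generator("L'intelligenza artificiale è", max_new_tokens=30)
print(result[0]["generated_text"])
```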
## Mlm Spanish Roberta Base

Author: MMG · Tags: Large Language Model, Transformers, Spanish

A Spanish pretrained language model based on the RoBERTa architecture, focused on masked language modeling tasks.

21 downloads · 2 likes
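Because this model targets masked language modeling, the natural entry point is the `fill-mask` pipeline. The repo id `MMG/mlm-spanish-roberta-base` is an assumption from the listing; RoBERTa-style tokenizers conventionally use `<mask>` as the mask token.

```python
# Minimal sketch: masked-token prediction with the fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="MMG/mlm-spanish-roberta-base",  # assumed repo id, inferred from the listing
)

# Print the top predictions for the masked token with their scores.
for prediction in fill_mask("Madrid es la capital de <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```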